September 12th, 2019 - New York
- `ga_model` to package up GA data modelling and presentation
- `plotly` for interactive plots, deployed via HTTP GET
- `plumber` deployed on Cloud Run for scale

Thanks to Tim Wilson's code on dartistics.com.
library(googleAnalyticsR)

# a ga_model: load a saved model file and apply it to a GA view
ga_time_normalised <- function(viewID, interactive_plot = TRUE){
  model <- ga_model_load("time-normalised.gamr")
  ga_model(viewID, model,
           first_day_pageviews_min = 2,
           total_unique_pageviews_cutoff = 500,
           days_live_range = 60,
           page_filter_regex = ".*",
           interactive_plot = interactive_plot)
}
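For context, a `.gamr` file such as `time-normalised.gamr` is created once with `ga_model_make()` and `ga_model_save()`, which bundle the data fetch, the modelling step and the plot into one shareable object. A minimal sketch follows; the function bodies and metric choices are illustrative placeholders, not the actual time-normalised model code:

```r
library(googleAnalyticsR)

# sketch only: the data_f/model_f/output_f bodies below are
# hypothetical stand-ins for the real time-normalised model logic
model <- ga_model_make(
  data_f = function(view_id, ...){
    google_analytics(view_id,
                     date_range = c(Sys.Date() - 90, Sys.Date()),
                     metrics = "uniquePageviews",
                     dimensions = c("date", "pagePath"),
                     max = -1)
  },
  required_columns = c("date", "pagePath", "uniquePageviews"),
  model_f = function(df, ...){
    # e.g. align each page's traffic to days since it went live
    df
  },
  output_f = function(df, ...){
    plot(df$date, df$uniquePageviews)
  },
  description = "Time-normalised pageviews"
)

ga_model_save(model, filename = "time-normalised.gamr")
```

Once saved, the `.gamr` file can be shipped alongside the API code and loaded with `ga_model_load()` as above.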
Make an API out of your script:
#' @get /hello
#' @html
function(){
  "<html><h1>hello world</h1></html>"
}
library(googleAnalyticsR)

#' Return output data from the ga_time_normalised ga_model
#' @param viewID The viewID for Google Analytics
#' @get /data
function(viewID=""){
  model <- ga_time_normalised(viewID)
  model$output
}
#' Plot out data from the ga_time_normalised ga_model
#' @param viewID The viewID for Google Analytics
#' @get /plot
#' @serializer htmlwidget
function(viewID=""){
  model <- ga_time_normalised(viewID)
  model$plot
}
library(plumber)
r <- plumb("api.R")
r$run(port=8000)
This creates a local web server that serves the R code over HTTP.
curl "http://localhost:8000/data?viewID=81416156"

See cloudRunR
Based on:
FROM trestletech/plumber
LABEL maintainer="mark"
COPY [".", "./"]
ENTRYPOINT ["R", "-e", \
  "pr <- plumber::plumb(commandArgs()[4]); pr$run(host='0.0.0.0', port=as.numeric(Sys.getenv('PORT')))"]
CMD ["api.R"]
# alternatively, generate a Dockerfile from the script's dependencies
library(containerit)
dd <- dockerfile("api.R")
write(dd, file = "Dockerfile")
Then add any packages the model needs to the Dockerfile.
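For instance, if the model uses `googleAnalyticsR`, `googleCloudStorageR` and `plotly`, one extra line in the Dockerfile above would install them (package list here is an assumption about what the model needs):

```dockerfile
# install the R packages the model depends on (illustrative list)
RUN R -e "install.packages(c('googleAnalyticsR', 'googleCloudStorageR', 'plotly'))"
```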
library(googleCloudStorageR)

# on Cloud Run, authenticate as the service account, then fetch
# the Google Analytics auth file from Cloud Storage
if(!is.null(googleAuthR::gar_gce_auth())){
  gcs_get_object("ga-auth.json", bucket = "your-bucket",
                 saveToDisk = "ga-auth.json", overwrite = TRUE)
}
library(googleAnalyticsR)
#..do calls..
Set up a build trigger for the GitHub repo you commit the Dockerfile to:
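The trigger needs a build configuration that builds the image and deploys it to Cloud Run. A minimal sketch of such a `cloudbuild.yaml`, assuming a service name of `my-r-api` and the `us-central1` region (both placeholders):

```yaml
# build the container, push it, then deploy to Cloud Run
steps:
- name: 'gcr.io/cloud-builders/docker'
  args: ['build', '-t', 'gcr.io/$PROJECT_ID/my-r-api', '.']
- name: 'gcr.io/cloud-builders/docker'
  args: ['push', 'gcr.io/$PROJECT_ID/my-r-api']
- name: 'gcr.io/cloud-builders/gcloud'
  args: ['run', 'deploy', 'my-r-api',
         '--image', 'gcr.io/$PROJECT_ID/my-r-api',
         '--region', 'us-central1',
         '--platform', 'managed',
         '--allow-unauthenticated']
images: ['gcr.io/$PROJECT_ID/my-r-api']
```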
Cloud Run scales automatically with request volume, and the same container workflow works for services written in any language.
curl "https://my-r-api-ewjogewawq-uc.a.run.app/data?viewID=81416156"

Keep in touch: